Simultaneous Variable Selection
Authors
Abstract
We propose a new method for selecting a common subset of explanatory variables where the aim is to explain or predict several response variables. The basic idea is a natural extension of the LASSO technique proposed by Tibshirani (1996) based on minimising the (joint) residual sum of squares while constraining the parameter estimates to lie within a suitable polyhedral region. This leads to a convex programming problem for which we develop an efficient interior point algorithm. The method is illustrated on a data set with infra-red spectrometry measurements on 14 qualitatively different but correlated responses using 770 wavelengths. The aim is to select a subset of the wavelengths suitable to use as predictors for as many as possible of the responses.
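As a rough illustration of the constrained estimation problem described above, the sketch below fits a multi-response linear model by minimising the joint residual sum of squares subject to a polyhedral budget on the coefficient matrix. The specific constraint used here (the sum over predictors of the row-wise maximum absolute coefficient bounded by t) is an assumption of this sketch, chosen as one natural multi-response extension of the LASSO constraint; the generic cvxpy solver stands in for the paper's purpose-built interior point algorithm, and names such as B and t are illustrative.

```python
# Illustrative sketch only: simultaneous variable selection for a
# multi-response linear model, assuming the polyhedral constraint
#   sum_j  max_k |B[j, k]|  <=  t,
# a joint budget on the rows of the coefficient matrix. Solved with
# cvxpy's generic solvers rather than the specialised interior point
# algorithm developed in the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p, q = 60, 20, 4                    # observations, predictors, responses
X = rng.standard_normal((n, p))
B_true = np.zeros((p, q))
B_true[:3] = rng.standard_normal((3, q))          # only 3 predictors matter
Y = X @ B_true + 0.1 * rng.standard_normal((n, q))

t = 4.0                                 # tuning parameter (constraint budget)
B = cp.Variable((p, q))
rss = cp.sum_squares(Y - X @ B)         # joint residual sum of squares
budget = cp.sum(cp.max(cp.abs(B), axis=1)) <= t   # polyhedral region
cp.Problem(cp.Minimize(rss), [budget]).solve()

# A predictor is dropped for *all* responses when its whole row is zero.
selected = np.flatnonzero(np.max(np.abs(B.value), axis=1) > 1e-6)
print("selected predictors:", selected)
```

Because the constraint couples the responses through the row-wise maxima, shrinking the budget t drives entire rows of B to zero, so a predictor (e.g. a wavelength) is kept or discarded simultaneously for all responses.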
Similar resources
Bayesian Predictive Simultaneous Variable and Transformation Selection in the Linear Model
Variable selection and transformation selection are two commonly encountered problems in the linear model. It is often of interest to combine these two procedures in an analysis. Due to recent developments in computing technology, such a procedure is now feasible. In this paper, we propose two variable and transformation selection procedures on the predictor variables in the linear model. The r...
Variable Selection in Single Index Quantile Regression for Heteroscedastic Data
Quantile regression (QR) has become a popular method of data analysis, especially when the error term is heteroscedastic, due to its relevance in many scientific studies. The ubiquity of high dimensional data has led to a number of variable selection methods for linear/nonlinear QR models and, recently, for the single index quantile regression (SIQR) model. We propose a new algorithm for simult...
A method for simultaneous variable selection and outlier identification in linear regression
We suggest a method for simultaneous variable selection and outlier identification based on the computation of posterior model probabilities. This avoids the problem that the model you select depends upon the order in which variable selection and outlier identification are carried out. Our method can find multiple outliers and appears to be successful in identifying masked outliers. We also add...
Variable Selection via Penalized Likelihood
Variable selection is vital to statistical data analyses. Many of the procedures in use are ad hoc stepwise selection procedures, which are computationally expensive and ignore stochastic errors in the variable selection process of previous steps. An automatic and simultaneous variable selection procedure can be obtained by using a penalized likelihood method. In traditional linear models, the best...
Variable Selection in Nonparametric and Semiparametric Regression Models
This chapter reviews the literature on variable selection in nonparametric and semiparametric regression models via shrinkage. We highlight recent developments on simultaneous variable selection and estimation through the methods of least absolute shrinkage and selection operator (Lasso), smoothly clipped absolute deviation (SCAD) or their variants, but restrict our attention to nonparametric a...
An Improved 1-norm SVM for Simultaneous Classification and Variable Selection
We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss by the adaptively weighted 1-norm penalty in which the weights are computed by the 2-norm SVM. Hence the new algorithm is called the hybrid SVM. Simulation and real data examples show that the hybrid SVM not only ofte...
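To make the adaptive weighting idea in the last item above concrete, here is a hedged sketch, not the cited authors' code: a standard 2-norm SVM provides pilot coefficients, and their absolute values reweight a subsequent 1-norm penalised fit. The 1/|w_tilde_j| weighting, the rescaling trick, and scikit-learn's LinearSVC (which pairs the 1-norm penalty with a squared hinge loss rather than the hinge loss itself) are assumptions or substitutions of this sketch.

```python
# Hedged sketch of the "hybrid SVM" idea: weights for the 1-norm penalty
# come from an initial 2-norm SVM fit. Not the cited authors' implementation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=25, n_informative=5,
                           random_state=1)

# Step 1: standard 2-norm SVM gives pilot coefficients w_tilde.
pilot = LinearSVC(penalty="l2", dual=False, C=1.0, max_iter=10000).fit(X, y)
w_tilde = pilot.coef_.ravel()

# Step 2: penalising sum_j |w_j| / |w_tilde_j| is equivalent to an ordinary
# 1-norm penalty after rescaling column j of X by |w_tilde_j|.
scale = np.abs(w_tilde) + 1e-8
hybrid = LinearSVC(penalty="l1", loss="squared_hinge", dual=False,
                   C=1.0, max_iter=10000).fit(X * scale, y)
w = hybrid.coef_.ravel() * scale        # back to the original variable scale

selected = np.flatnonzero(np.abs(w) > 1e-8)
print("selected variables:", selected)
```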
Journal: Technometrics
Volume: 47, Issue: -
Pages: -
Publication date: 2005